Semi-Supervised learning with Collaborative Bagged Multi-label K-Nearest-Neighbors
Authors
Abstract
Similar resources
Semi-supervised Learning for Multi-label Classification
In this report we consider the semi-supervised learning problem for multi-label image classification, aiming at effectively taking advantage of both labeled and unlabeled training data in the training process. In particular, we implement and analyze various semi-supervised learning approaches including a support vector machine (SVM) method facilitated by principal component analysis (PCA), and ...
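As a rough illustration of the PCA-plus-SVM direction mentioned in that abstract, the following is a minimal self-training sketch in Python: PCA reduces the feature dimension, a one-vs-rest linear SVM is fit on the labeled data, and confidently scored unlabeled points are pseudo-labeled and folded back in. The toy data, dimensionality, and confidence threshold are assumptions for illustration, not the report's actual configuration.

```python
# Hedged self-training sketch of semi-supervised multi-label classification
# with PCA features and a one-vs-rest linear SVM (toy data, assumed threshold).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(200, 50))          # labeled image features (toy)
Y_labeled = rng.integers(0, 2, size=(200, 5))   # binary indicator matrix, 5 labels
X_unlabeled = rng.normal(size=(800, 50))        # unlabeled pool

# Reduce dimensionality with PCA, then fit one linear SVM per label.
pca = PCA(n_components=20).fit(np.vstack([X_labeled, X_unlabeled]))
clf = OneVsRestClassifier(LinearSVC())
clf.fit(pca.transform(X_labeled), Y_labeled)

# Pseudo-label unlabeled points whose SVM margins are all large, then retrain.
scores = clf.decision_function(pca.transform(X_unlabeled))
confident = np.abs(scores).min(axis=1) > 1.0    # assumed confidence rule
X_aug = np.vstack([X_labeled, X_unlabeled[confident]])
Y_aug = np.vstack([Y_labeled, (scores[confident] > 0).astype(int)])
clf.fit(pca.transform(X_aug), Y_aug)
```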
Classification with Learning K-nearest Neighbors
The nearest neighbor (NN) classifiers, especially the k-NN algorithm, are among the simplest and yet most efficient classification rules and are widely used in practice. We introduce three adaptation rules that can be used in iterative training of a k-NN classifier. This is a novel approach both from the statistical pattern recognition and the supervised neural network learning points of view. The s...
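The three adaptation rules are not spelled out in the excerpt above, so the following Python sketch only conveys the general idea of iteratively adapting the stored prototypes of a k-NN classifier, using an LVQ1-like update as an assumed stand-in; the learning rate and epochs are illustrative.

```python
# Hedged LVQ1-style sketch of iterative prototype adaptation for k-NN
# (assumed update rule; the paper's actual three rules differ).
import numpy as np

def knn_predict(prototypes, proto_labels, x, k=3):
    """Majority vote among the k nearest stored prototypes."""
    d = np.linalg.norm(prototypes - x, axis=1)
    nearest = np.argsort(d)[:k]
    return np.bincount(proto_labels[nearest]).argmax()

def adapt(prototypes, proto_labels, X_train, y_train, lr=0.05, epochs=10):
    """Move the nearest prototype toward a correctly labeled sample and
    away from a mislabeled one (assumed LVQ1-like rule)."""
    for _ in range(epochs):
        for x, y in zip(X_train, y_train):
            i = np.argmin(np.linalg.norm(prototypes - x, axis=1))
            direction = 1.0 if proto_labels[i] == y else -1.0
            prototypes[i] += direction * lr * (x - prototypes[i])
    return prototypes

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
protos, proto_y = X[::10].copy(), y[::10].copy()   # small prototype set
protos = adapt(protos, proto_y, X, y)
print(knn_predict(protos, proto_y, np.array([3.0, 3.0])))   # expected: 1
```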
Semi-Supervised Multi-Label Learning with Incomplete Labels
The problem of incomplete labels is frequently encountered in many application domains where the training labels are obtained via crowd-sourcing. The label incompleteness significantly increases the difficulty of acquiring accurate multi-label prediction models. In this paper, we propose a novel semi-supervised multi-label method that integrates low-rank label matrix recovery into the manifold ...
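The sketch below illustrates only the "low-rank label matrix recovery" ingredient named in that abstract, using a soft-impute style SVD shrinkage to fill in missing entries of an incomplete label matrix; the manifold regularization over unlabeled data is omitted, and the threshold, iteration count, and toy data are assumptions.

```python
# Hedged sketch: low-rank recovery of an incomplete label matrix via
# iterative SVD soft-thresholding (soft-impute style; parameters assumed).
import numpy as np

def recover_low_rank(Y, observed, tau=1.0, n_iter=100):
    """Repeatedly replace unobserved entries with a soft-thresholded
    SVD reconstruction of the current estimate."""
    Z = np.where(observed, Y, 0.0)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - tau, 0.0)          # nuclear-norm style shrinkage
        Z_low = (U * s) @ Vt
        Z = np.where(observed, Y, Z_low)      # keep observed labels fixed
    return Z_low

rng = np.random.default_rng(2)
true_Y = (rng.normal(size=(100, 4)) @ rng.normal(size=(4, 10)) > 0).astype(float)
mask = rng.random(true_Y.shape) > 0.3         # ~70% of labels observed
recovered = recover_low_rank(true_Y, mask)
print(((recovered > 0.5) == true_Y).mean())   # agreement with the full matrix
```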
Label Ranking with Semi-Supervised Learning
Label ranking is considered an efficient approach for object recognition, document classification, and recommendation tasks, and has been widely studied in recent years. It aims to learn a mapping from instances to a ranking over a finite set of predefined labels. Traditional solutions for label ranking cannot obtain satisfactory results by only utilizing labeled data and ignore large amo...
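To make the task set-up concrete, here is a hedged score-then-sort baseline for label ranking in Python: one linear scorer per label predicts a relevance score and labels are ordered by those scores. It illustrates only the mapping from instances to label rankings, not the semi-supervised method of that paper; the data and the Ridge models are assumptions.

```python
# Hedged label-ranking baseline: per-label scorers, then sort by predicted score.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 20))
true_scores = X @ rng.normal(size=(20, 5))             # hidden utility of 5 labels
# Training targets: observed ranking positions (0 = most preferred).
Y_rank = np.argsort(np.argsort(-true_scores, axis=1), axis=1)

# One scorer per label, trained on the negated rank position.
models = [Ridge().fit(X, -Y_rank[:, j]) for j in range(Y_rank.shape[1])]

def predict_ranking(x):
    """Return label indices ordered from most to least preferred."""
    scores = np.array([m.predict(x.reshape(1, -1))[0] for m in models])
    return np.argsort(-scores)

print(predict_ranking(X[0]))   # label indices, best first
```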
ON SUPERVISED AND SEMI-SUPERVISED k-NEAREST NEIGHBOR ALGORITHMS
The k-nearest neighbor (kNN) is one of the simplest classification methods used in machine learning. Since the main component of kNN is a distance metric, kernelization of kNN is possible. In this paper kNN and semi-supervised kNN algorithms are empirically compared on two data sets (the USPS data set and a subset of the Reuters-21578 text categorization corpus). We use a soft version of the kN...
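The kernelization mentioned there rests on the identity d(x,z)^2 = k(x,x) - 2k(x,z) + k(z,z), and a "soft" vote can weight neighbors by similarity rather than counting them. The sketch below shows that combination with an RBF kernel; the kernel width, k, and toy data are assumptions, not the paper's experimental setup.

```python
# Hedged kernelized "soft" kNN sketch: kernel-space distances plus
# similarity-weighted voting (RBF kernel, parameters assumed).
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_knn_predict(X_train, y_train, X_test, k=5, gamma=0.5):
    K_tt = rbf_kernel(X_test, X_train, gamma=gamma)   # k(x_test, x_train)
    d2 = 2.0 - 2.0 * K_tt                             # since k(x, x) = 1 for RBF
    preds = []
    for row_d2, row_k in zip(d2, K_tt):
        nearest = np.argsort(row_d2)[:k]
        # Soft vote: accumulate kernel similarity per class instead of counting.
        weights = np.zeros(y_train.max() + 1)
        for i in nearest:
            weights[y_train[i]] += row_k[i]
        preds.append(weights.argmax())
    return np.array(preds)

rng = np.random.default_rng(4)
X_train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
X_test = np.array([[0.0, 0.0], [3.0, 3.0]])
print(kernel_knn_predict(X_train, y_train, X_test))   # expected: [0 1]
```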
Journal
Journal title: Open Computer Science
Year: 2019
ISSN: 2299-1093
DOI: 10.1515/comp-2019-0017